Subspace manifold learning with sample weights
Authors
Abstract
Subspace manifold learning represents a popular class of techniques in statistical image analysis and object recognition. Recent research in the field has focused on nonlinear representations; locally linear embedding (LLE) is one such technique that has recently gained popularity. We present and apply a generalization of LLE that introduces sample weights. We demonstrate the application of the technique to face recognition, where a model exists to describe each face's probability of occurrence. These probabilities are used as weights in the learning of the low-dimensional face manifold. Results of face recognition using this approach are compared against standard unweighted LLE and PCA. A significant improvement in recognition rates is realized using weighted LLE on a data set where face occurrences follow the modeled distribution. © 2007 Elsevier B.V. All rights reserved.
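To make the idea concrete, below is a minimal sketch of locally linear embedding with per-sample weights. It assumes the weights (e.g., the modeled probabilities of occurrence) enter the embedding step as a diagonal weighting of each point's reconstruction error; the function name weighted_lle, its parameters, and the toy occurrence model are illustrative assumptions, not the paper's exact formulation.

    # Sketch of LLE with per-sample weights (one plausible instantiation, not
    # necessarily the paper's formulation): sample weights p_i scale each point's
    # contribution to the embedding cost, sum_i p_i * ||y_i - sum_j W_ij y_j||^2.
    import numpy as np
    from scipy.linalg import eigh

    def weighted_lle(X, sample_weights, n_neighbors=10, n_components=2, reg=1e-3):
        """X: (n_samples, n_features); sample_weights: (n_samples,) nonnegative."""
        n = X.shape[0]

        # Step 1: k nearest neighbors by Euclidean distance (brute force for clarity).
        d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
        np.fill_diagonal(d2, np.inf)
        nbrs = np.argsort(d2, axis=1)[:, :n_neighbors]

        # Step 2: local reconstruction weights, one constrained least-squares per point.
        W = np.zeros((n, n))
        for i in range(n):
            Z = X[nbrs[i]] - X[i]                         # neighbors centered on x_i
            C = Z @ Z.T                                   # local Gram matrix
            C += reg * np.trace(C) * np.eye(n_neighbors)  # regularize for stability
            w = np.linalg.solve(C, np.ones(n_neighbors))
            W[i, nbrs[i]] = w / w.sum()                   # enforce sum-to-one constraint

        # Step 3: weighted embedding cost. With P = diag(p), minimize
        # trace(Y^T (I - W)^T P (I - W) Y); M stays symmetric positive semidefinite.
        P = np.diag(sample_weights / np.sum(sample_weights))
        IW = np.eye(n) - W
        M = IW.T @ P @ IW

        # Bottom non-constant eigenvectors give the embedding.
        vals, vecs = eigh(M)
        return vecs[:, 1:n_components + 1]

    # Toy usage: points on a noisy curve, with a hypothetical occurrence model as weights.
    rng = np.random.default_rng(0)
    t = np.sort(rng.uniform(0, 3 * np.pi, 200))
    X = np.column_stack([t, np.sin(t)]) + 0.05 * rng.standard_normal((200, 2))
    p = np.exp(-0.5 * ((t - 1.5 * np.pi) / np.pi) ** 2)
    Y = weighted_lle(X, p, n_neighbors=8, n_components=1)

Setting all weights equal reduces the embedding step to standard LLE, since the diagonal weighting then becomes a constant scaling of the usual cost matrix.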
Related resources
Image alignment via kernelized feature learning
Machine learning is an application of artificial intelligence that is able to automatically learn and improve from experience without being explicitly programmed. The primary assumption for most of the machine learning algorithms is that the training set (source domain) and the test set (target domain) follow from the same probability distribution. However, in most of the real-world application...
Learning a speech manifold for signal subspace speech denoising
We present a method for learning a low-dimensional manifold for speech from clean speech samples in high-dimensional space. Using this manifold, we perform speech denoising by projecting noisy speech onto the manifold to remove non-speech components. This method of denoising classifies our algorithm as a signal subspace denoising method, where high-dimensional noisy data is projected onto the sig...
Linear Subspace and Manifold Learning via Extrinsic Geometry
Linear Subspace and Manifold Learning via Extrinsic Geometry, by Brian St. Thomas, Department of Statistical Science, Duke University; supervisor: Sayan Mukherjee.
A Manifold Approach to Learning Mutually Orthogonal Subspaces
Although many machine learning algorithms involve learning subspaces with particular characteristics, optimizing a parameter matrix that is constrained to represent a subspace can be challenging. One solution is to use Riemannian optimization methods that enforce such constraints implicitly, leveraging the fact that the feasible parameter values form a manifold. While Riemannian methods exist f...
Bayesian Manifold Regression
There is increasing interest in the problem of nonparametric regression with high-dimensional predictors. When the number of predictors D is large, one encounters a daunting problem in attempting to estimate a D-dimensional surface based on limited data. Fortunately, in many applications, the support of the data is concentrated on a d-dimensional subspace with d ≪ D. Manifold learning attempts ...
Journal title: Image Vision Comput.
Volume 27, Issue -
Pages -
Publication date: 2009